Scalable Importance Tempering and Bayesian Variable Selection
We propose a Monte Carlo algorithm to sample from high dimensional
probability distributions that combines Markov chain Monte Carlo and importance
sampling. We provide a careful theoretical analysis, including guarantees on
robustness to high dimensionality, explicit comparison with standard Markov
chain Monte Carlo methods and illustrations of the potential improvements in
efficiency. Simple and concrete intuition is provided for when the novel scheme
is expected to outperform standard schemes. When applied to Bayesian
variable-selection problems, the novel algorithm is orders of magnitude more
efficient than available alternative sampling schemes and enables fast and
reliable fully Bayesian inference with tens of thousands of regressors.
Comment: Online supplement not included.
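The core idea above can be illustrated with a minimal sketch, assuming a simplified form of the scheme (not the paper's exact algorithm): run a Metropolis chain on a tempered target pi(x)^beta, then use self-normalized importance weights pi(x)^(1-beta) so that weighted averages estimate expectations under the original target pi. The target, tempering exponent, and step size below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pi(x):
    # Illustrative target: standard normal
    return -0.5 * x**2

beta = 0.5          # tempering exponent; beta < 1 flattens the target
n_steps = 50_000
x = 0.0
samples = np.empty(n_steps)
for t in range(n_steps):
    prop = x + rng.normal(scale=1.5)
    # Metropolis accept/reject on the tempered target pi^beta
    if np.log(rng.uniform()) < beta * (log_pi(prop) - log_pi(x)):
        x = prop
    samples[t] = x

# Importance weights pi / pi^beta = pi^(1-beta), self-normalized
log_w = (1.0 - beta) * log_pi(samples)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Weighted estimate of E_pi[x^2]; should be close to 1 for a standard normal
est_second_moment = np.sum(w * samples**2)
print(round(est_second_moment, 2))
```

The tempered chain mixes faster because the flattened target is easier to traverse; the reweighting step then corrects the resulting bias.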
Particle Gibbs with Ancestor Sampling
Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining
the two main tools used for Monte Carlo statistical inference: sequential Monte
Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a novel PMCMC
algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS).
PGAS provides the data analyst with an off-the-shelf class of Markov kernels
that can be used to simulate the typically high-dimensional and highly
autocorrelated state trajectory in a state-space model. The ancestor sampling
procedure enables fast mixing of the PGAS kernel even when using seemingly few
particles in the underlying SMC sampler. This is important as it can
significantly reduce the computational burden that is typically associated with
using SMC. PGAS is conceptually similar to the existing PG with backward
simulation (PGBS) procedure. Instead of using separate forward and backward
sweeps as in PGBS, however, we achieve the same effect in a single forward
sweep. This makes PGAS well suited for addressing inference problems not only
in state-space models, but also in models with more complex dependencies, such
as non-Markovian, Bayesian nonparametric, and general probabilistic graphical
models.
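The ancestor-sampling step described above can be sketched as follows, assuming a toy linear-Gaussian state-space model (x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r)) rather than the paper's general setting. One particle slot is pinned to a reference trajectory x_ref (which in a real sampler would come from the previous Gibbs iteration; here it is an arbitrary placeholder), and instead of keeping its old ancestor, the reference particle's ancestor is redrawn with probability proportional to w_{t-1}^i * f(x_ref[t] | x_{t-1}^i).

```python
import numpy as np

rng = np.random.default_rng(1)
a, q, r = 0.9, 1.0, 0.5   # illustrative model parameters
T, N = 25, 20             # time steps, particles

# Simulated data and a placeholder reference trajectory
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t-1] + rng.normal(scale=np.sqrt(q))
y = x_true + rng.normal(scale=np.sqrt(r), size=T)
x_ref = np.zeros(T)

def log_f(x_t, x_prev):
    # Transition density log f(x_t | x_prev), up to a constant
    return -0.5 * (x_t - a * x_prev)**2 / q

particles = rng.normal(size=N)
particles[-1] = x_ref[0]                 # pin the reference particle
logw = -0.5 * (y[0] - particles)**2 / r
anc_store = np.zeros((T, N), dtype=int)
for t in range(1, T):
    w = np.exp(logw - logw.max()); w /= w.sum()
    anc = rng.choice(N, size=N, p=w)     # multinomial resampling
    # Ancestor sampling for the reference slot: weight each candidate
    # ancestor by its filter weight times the transition to x_ref[t]
    logw_as = np.log(w) + log_f(x_ref[t], particles)
    w_as = np.exp(logw_as - logw_as.max()); w_as /= w_as.sum()
    anc[-1] = rng.choice(N, p=w_as)
    anc_store[t] = anc
    particles = a * particles[anc] + rng.normal(scale=np.sqrt(q), size=N)
    particles[-1] = x_ref[t]             # keep the reference state pinned
    logw = -0.5 * (y[t] - particles)**2 / r

print(particles.shape)
```

The single redrawn ancestor index is what breaks up the reference trajectory in one forward sweep, which is the effect PGBS achieves with a separate backward pass.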
Classical and Bayesian Analysis of Univariate and Multivariate Stochastic Volatility Models
In this paper Efficient Importance Sampling (EIS) is used to perform a classical and Bayesian analysis of univariate and multivariate Stochastic Volatility (SV) models for financial return series. EIS provides a highly generic and very accurate procedure for the Monte Carlo (MC) evaluation of high-dimensional interdependent integrals. It can be used to carry out ML-estimation of SV models as well as simulation smoothing, where the latent volatilities are sampled at once. Based on this EIS simulation smoother, a Bayesian Markov Chain Monte Carlo (MCMC) posterior analysis of the parameters of SV models can be performed.
Keywords: Dynamic Latent Variables, Markov Chain Monte Carlo, Maximum Likelihood, Simulation Smoother
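The kind of integral EIS targets can be illustrated with a minimal sketch, assuming a toy one-period SV model rather than the paper's multi-period setting: h ~ N(mu, s2) is a latent log-volatility and y | h ~ N(0, exp(h)). The likelihood p(y) = E_h[p(y | h)] has no closed form, so it is estimated by drawing from an importance density g(h) and averaging p(y|h)p(h)/g(h). The fixed Gaussian proposal below is an assumption; EIS itself would fit g(h) iteratively by weighted least squares.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, s2 = 0.0, 1.0   # prior on log-volatility h (illustrative)
y_obs = 0.8

def log_norm(x, m, v):
    # Log density of N(m, v) evaluated at x
    return -0.5 * (np.log(2 * np.pi * v) + (x - m)**2 / v)

n = 100_000
# Importance density roughly matched to the integrand (hand-picked here)
g_mean, g_var = -0.2, 1.0
h = rng.normal(g_mean, np.sqrt(g_var), size=n)
log_w = (log_norm(y_obs, 0.0, np.exp(h))   # measurement density p(y | h)
         + log_norm(h, mu, s2)             # prior p(h)
         - log_norm(h, g_mean, g_var))     # proposal g(h)
like_hat = float(np.exp(log_w).mean())
print(round(like_hat, 3))
```

In the full SV setting the same construction runs over the whole latent volatility path, which is why the quality of the importance density matters so much in high dimensions.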